Structured Priors for Structure Learning

Authors

  • Vikash K. Mansinghka
  • Charles Kemp
  • Thomas L. Griffiths
  • Joshua B. Tenenbaum
Abstract

Traditional approaches to Bayes net structure learning typically assume little regularity in graph structure other than sparseness. However, in many cases, we expect more systematicity: variables in real-world systems often group into classes that predict the kinds of probabilistic dependencies they participate in. Here we capture this form of prior knowledge in a hierarchical Bayesian framework, and exploit it to enable structure learning and type discovery from small datasets. Specifically, we present a nonparametric generative model for directed acyclic graphs as a prior for Bayes net structure learning. Our model assumes that variables come in one or more classes and that the prior probability of an edge existing between two variables is a function only of their classes. We derive an MCMC algorithm for simultaneous inference of the number of classes, the class assignments of variables, and the Bayes net structure over variables. For several realistic, sparse datasets, we show that the bias towards systematicity of connections provided by our model can yield more accurate learned networks than the traditional approach of using a uniform prior, and that the classes found by our model are appropriate.
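The generative model sketched in the abstract can be illustrated in a few lines of code. The sketch below is an assumption-laden reading, not the paper's implementation: class assignments are drawn from a Chinese restaurant process, each pair of classes gets a Beta-distributed edge probability, and acyclicity is enforced by sampling edges only forward along a random ordering. The hyperparameter names (`alpha`, `a`, `b`) and the ordering trick are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_crp_classes(n, alpha=1.0):
    """Assign n variables to classes via a Chinese restaurant process."""
    classes = [0]
    counts = [1]
    for _ in range(1, n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)   # open a new class
        else:
            counts[k] += 1
        classes.append(k)
    return np.array(classes)

def sample_dag(n, alpha=1.0, a=1.0, b=5.0):
    """Sample a DAG whose edge probability depends only on the classes
    of the two endpoint variables (a blockmodel prior over structures)."""
    z = sample_crp_classes(n, alpha)
    K = z.max() + 1
    eta = rng.beta(a, b, size=(K, K))   # class-pair edge probabilities
    order = rng.permutation(n)          # a topological order guarantees acyclicity
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            if order[i] < order[j]:     # only forward edges are allowed
                adj[i, j] = rng.random() < eta[z[i], z[j]]
    return z, adj
```

Because every edge points forward along `order`, the sampled graph is acyclic by construction; the small Beta parameter `b` biases the prior toward sparse graphs, matching the sparse, systematic structures the paper targets.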


Related Papers

Structured priors in visual working memory revealed through iterated learning

What hierarchical structures do people use to encode visual displays? We examined visual working memory’s priors for locations by asking participants to recall the locations of objects in an iterated learning task. We designed a nonparametric clustering algorithm that infers the clustering structure of objects and encodes individual items within this structure. Over many iterations, participant...


What Makes a Good Detector? - Structured Priors for Learning from Few Examples

Transfer learning can counter the heavy-tailed nature of the distribution of training examples over object classes. Here, we study transfer learning for object class detection. Starting from the intuition that "what makes a good detector" should manifest itself in the form of repeatable statistics over existing "good" detectors, we design a low-level feature model that can be used as a prior for...


Task-Driven Dictionary Learning for Hyperspectral Image Classification with Structured Sparsity Constraints

Sparse representation models a signal as a linear combination of a small number of dictionary atoms. As a generative model, it requires the dictionary to be highly redundant in order to ensure both a stable high sparsity level and a low reconstruction error for the signal. Howe...


Sequential parameter learning and filtering in structured autoregressive state-space models

We present particle-based algorithms for sequential filtering and parameter learning in state-space autoregressive (AR) models with structured priors. Non-conjugate priors are specified on the AR coefficients at the system level by imposing uniform or truncated normal priors on the moduli and wavelengths of the reciprocal roots of the AR characteristic polynomial. Sequential Monte Carlo algorit...



Journal:
  • CoRR

Volume: abs/1206.6852

Publication date: 2006